Sensor Fusion for Augmented Reality
Authors
Abstract
In this paper we describe in detail our sensor fusion framework for augmented reality applications. We combine inertial sensors with a compass, DGPS, and a camera to determine the position of the user’s head. We use two separate extended complementary Kalman filters for orientation and position. The orientation filter uses quaternions for a stable representation of the orientation.
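The paper's filters are extended complementary Kalman filters; as a much simpler illustration of the quaternion-based orientation idea, the sketch below implements a basic complementary filter that integrates gyroscope rates and corrects drift with the accelerometer's gravity direction. This is not the authors' implementation: the function names, the gain `alpha`, and the pure-Python quaternion helpers are all assumptions for illustration.

```python
import math

def quat_mult(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_normalize(q):
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c / n for c in q)

def quat_from_rotvec(v):
    """Quaternion for a rotation vector (axis * angle, radians)."""
    angle = math.sqrt(sum(c*c for c in v))
    if angle < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)
    s = math.sin(angle / 2.0) / angle
    return (math.cos(angle / 2.0), v[0]*s, v[1]*s, v[2]*s)

def body_gravity(q):
    """World up-vector (0, 0, 1) expressed in the body frame of q."""
    w, x, y, z = q
    return (2*(x*z - w*y), 2*(y*z + w*x), w*w - x*x - y*y + z*z)

def complementary_step(q, gyro, accel, dt, alpha=0.02):
    """One filter update: gyro propagation plus accelerometer tilt correction."""
    # 1. Propagate orientation with the gyroscope rates (rad/s).
    q = quat_mult(q, quat_from_rotvec((gyro[0]*dt, gyro[1]*dt, gyro[2]*dt)))
    # 2. Rotate the predicted gravity direction toward the measured one.
    n = math.sqrt(sum(c*c for c in accel))
    if n > 1e-9:
        a = (accel[0]/n, accel[1]/n, accel[2]/n)
        g = body_gravity(q)
        # Small-angle correction axis: a x g (zero when prediction matches).
        err = (a[1]*g[2] - a[2]*g[1],
               a[2]*g[0] - a[0]*g[2],
               a[0]*g[1] - a[1]*g[0])
        q = quat_mult(q, quat_from_rotvec((alpha*err[0], alpha*err[1], alpha*err[2])))
    return quat_normalize(q)
```

The quaternion state avoids the gimbal-lock singularities of Euler angles, which is one reason the abstract highlights quaternions for a stable orientation representation; yaw cannot be corrected from gravity alone, which is why the paper adds a compass.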
Similar Resources

Simulation-Based Augmented Reality for Sensor Network
Software development for sensor networks is made difficult by resource-constrained sensor devices, distributed-system complexity, communication unreliability, and high labor cost. Simulation, as a useful tool, provides an affordable way to study algorithmic problems with flexibility and controllability. However, in exchange for speed, simulation often trades detail that ultimately limits its util...
Smartphone Sensor Reliability for Augmented Reality Applications
With increasing reliance on the location and orientation sensors in smartphones for not only augmented reality applications, but also for meeting government-mandated emergency response requirements, the reliability of these sensors is a matter of great importance. Previous studies measure the accuracy of the location sensing, typically GPS, in handheld devices including smartphones, but few stu...
Motion model transitions in GPS-IMU sensor fusion for user tracking in augmented reality
Finding the position of the user is an important processing step for augmented reality (AR) applications. This paper investigates the use of different motion models in order to choose the most suitable one, and eventually reduce the Kalman filter errors in sensor fusion for such applications where the accuracy of user tracking is crucial. A Deterministic Finite Automaton (DFA) was employed usin...
Sensor Fusion for Augmented Reality, Report no. LiTH-ISY-R-2875
The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the p...
Journal
Journal title: IFAC Proceedings Volumes
Year: 2008
ISSN: 1474-6670
DOI: 10.3182/20080706-5-kr-1001.02390